62 research outputs found

    New methods for travel time estimation on freeway sections

    In this paper we present two novel approaches for estimating the travel times between subsequent detector stations in a freeway network with long distances between detector stations and several unobserved on- and off-ramps. The network under investigation is a two-lane freeway. The maximum distance between detector stations for which travel times were estimated is about 20 km, with four unobserved on- and off-ramps in between. The algorithms were applied to real data sets and yielded plausible estimates; however, because the actual ('true') travel times were unknown, a performance assessment on the real data was not possible. The algorithms were therefore also applied to simulated data with known travel times, which allowed the estimated travel times to be verified. The simulated data were generated with the microscopic traffic simulation tool AIMSUN NG®. The detector stations were assumed to be equipped with the widely used double loop detectors, as is predominantly the case on the Swiss national road network: for each vehicle, the only information used was its length (with superimposed measurement noise) and its arrival time at the detector stations. With both methods, all relevant travel time characteristics were correctly identified for the investigated scenarios, and a comparison of the estimates with the actual travel times showed very good accuracy. Besides the fact that the methods work well even under difficult conditions (long distances, unobserved ramps), they offer several practical benefits: provided that single-vehicle data of sufficient accuracy are available, no additional infrastructure investments are required; both methods are fully anonymous, so no conclusions about individual vehicles can be drawn; extensions to more sophisticated detection technologies that provide additional vehicle features (e.g., height, width) are straightforward; and the travel time estimates form a good basis for any travel time prediction method.
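
    The abstract does not spell out the two estimation algorithms, but it does state the data they operate on: a noisy vehicle length and an arrival time per vehicle and station. Purely as an illustration of how such data can yield travel times, the Python sketch below matches each downstream vehicle to the upstream vehicle with the most similar measured length inside a plausible travel-time window and summarizes the matched time differences. The window, thresholds, and matching rule are assumptions made for this sketch and are not the methods proposed in the paper.

```python
import numpy as np

def estimate_travel_times(up_times, up_lengths, down_times, down_lengths,
                          t_min=600.0, t_max=2400.0, max_len_diff=0.3):
    """Length-based re-identification between two detector stations (illustrative).

    up_times, down_times     : per-vehicle arrival times [s] at the two stations
    up_lengths, down_lengths : measured vehicle lengths [m], with noise
    t_min, t_max             : plausible travel-time window [s] (assumed values)
    max_len_diff             : maximum accepted length difference [m] (assumed)
    """
    travel_times = []
    for t_d, len_d in zip(down_times, down_lengths):
        # Upstream vehicles that could plausibly be this downstream vehicle.
        mask = (up_times >= t_d - t_max) & (up_times <= t_d - t_min)
        if not mask.any():
            continue  # e.g. a vehicle that entered via an unobserved on-ramp
        cand_times = up_times[mask]
        cand_lengths = up_lengths[mask]
        # Accept the candidate with the most similar (noisy) length.
        best = int(np.argmin(np.abs(cand_lengths - len_d)))
        if abs(cand_lengths[best] - len_d) <= max_len_diff:
            travel_times.append(t_d - cand_times[best])
    return np.array(travel_times)

# Synthetic check: a 20 km section with a travel time of roughly 11 minutes.
rng = np.random.default_rng(0)
n = 200
up_times = np.sort(rng.uniform(0.0, 3600.0, n))
true_lengths = rng.uniform(3.5, 18.0, n)
up_lengths = true_lengths + rng.normal(0.0, 0.1, n)
down_times = up_times + rng.normal(660.0, 30.0, n)       # ~11 min travel time
down_lengths = true_lengths + rng.normal(0.0, 0.1, n)    # independent length noise

tt = estimate_travel_times(up_times, up_lengths, down_times, down_lengths)
print(f"matched {len(tt)} vehicles, median travel time {np.median(tt):.0f} s")
```

    On real data the unobserved ramps mean that a substantial share of vehicles cannot be matched at all, which is why robust summaries (the median above) and stricter acceptance thresholds become important.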

    A new method for travel time estimation on long freeway sections

    The knowledge of travel times on road networks is of vital importance for network operators as well as for drivers. Operators can use travel time information to improve control of their networks. Drivers and transport companies can choose their optimal route based on the available traffic information and their individual preferences. The presented approach focuses on travel time estimation. The method only requires the time stamps and vehicle lengths captured at subsequent detector stations. The distance between the stations is up to 13 km. The examined network is a two-lane freeway with three unobserved on-ramps and three unobserved off-ramps. In order to investigate the influence of measurement errors and to have true travel times for comparison, simulated detector data based on an existing Swiss freeway section equipped with loop detectors were used. The method shows very good performance for the investigated scenarios: all relevant characteristics of the travel time process were detected, and estimation errors remained within an acceptable range. Compared to existing travel time estimation methods, the presented approach considerably extends the maximum distance over which travel time estimation can be carried out.

    RACE: Remote Analysis Computation for gene Expression data

    The Remote Analysis Computation for gene Expression data (RACE) suite is a collection of bioinformatics web tools designed for the analysis of DNA microarray data. RACE performs probe-level data preprocessing, extensive quality checks, data visualization and data normalization for Affymetrix GeneChips. In addition, it offers differential expression analysis on normalized expression levels from any array platform. RACE estimates the false discovery rates of lists of potentially regulated genes and provides a Gene Ontology-term analysis tool for GeneChip data to support the biological interpretation and annotation of results. The analysis is fully automated but can be customized by flexible parameter settings. To offer a convenient starting point for subsequent analyses, and to provide maximum transparency, the R scripts used to generate the results can be downloaded along with the output files. RACE is freely available for use at
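
    RACE reports false discovery rates for the resulting gene lists; the abstract does not say which estimator is used, so the snippet below shows the standard Benjamini-Hochberg procedure only as a generic illustration of FDR-adjusted p-values, not as RACE's actual implementation.

```python
import numpy as np

def benjamini_hochberg(p_values):
    """Return Benjamini-Hochberg adjusted p-values (q-values) for a 1-D array."""
    p = np.asarray(p_values, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order] * m / np.arange(1, m + 1)
    # Enforce monotonicity from the largest p-value downwards.
    q_sorted = np.minimum.accumulate(ranked[::-1])[::-1]
    q = np.empty(m)
    q[order] = np.clip(q_sorted, 0.0, 1.0)
    return q

# Toy example: genes with q < 0.05 would pass at a 5% false discovery rate.
p_values = np.array([0.0002, 0.01, 0.03, 0.04, 0.20, 0.45, 0.60, 0.90])
print(benjamini_hochberg(p_values))
```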

    Estimating Conditional Distributions with Neural Networks using R package deeptrafo

    Contemporary empirical applications frequently require flexible regression models for complex response types and large tabular or non-tabular data, including images or text. Classical regression models either break down under the computational load of processing such data or require additional manual feature extraction to make these problems tractable. Here, we present deeptrafo, a package for fitting flexible regression models for conditional distributions using a tensorflow backend with numerous additional processors, such as neural networks, penalties, and smoothing splines. Package deeptrafo implements deep conditional transformation models (DCTMs) for binary, ordinal, count, survival, continuous, and time series responses, potentially with uninformative censoring. Unlike other available methods, DCTMs do not assume a parametric family of distributions for the response. Further, the data analyst may trade off interpretability and flexibility by supplying custom neural network architectures and smoothers for each term in an intuitive formula interface. We demonstrate how to set up, fit, and work with DCTMs for several response types. We further showcase how to construct ensembles of these models, evaluate models using inbuilt cross-validation, and use other convenience functions for DCTMs in several applications. Lastly, we discuss DCTMs in light of other approaches to regression with non-tabular data.

    Probabilistic Short-Term Low-Voltage Load Forecasting using Bernstein-Polynomial Normalizing Flows

    The transition to a fully renewable energy grid requires better forecasting of demand at the low-voltage level. However, high fluctuations and increasing electrification cause large forecast errors with traditional point estimates. Probabilistic load forecasts take future uncertainties into account and thus enable various applications in low-carbon energy systems. We propose an approach for flexible conditional density forecasting of short-term load based on Bernstein-polynomial normalizing flows, where a neural network controls the parameters of the flow. In an empirical study with 363 smart meter customers, our density predictions compare favorably against Gaussian and Gaussian mixture densities and also outperform a non-parametric approach based on the pinball loss for 24h-ahead load forecasting for two different neural network architectures.
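
    As a rough illustration of the modelling idea, not of the authors' implementation, the following PyTorch sketch lets a small neural network map covariates (e.g. calendar and lagged-load features) to the increasing coefficients of a Bernstein polynomial, which transforms the scaled load to a standard-normal base distribution. The polynomial order, network size, and base distribution are assumptions made for this sketch.

```python
import math
import torch
import torch.nn as nn

def bernstein_basis(y, order):
    """Bernstein basis polynomials B_{k,order}(y) for y in (0, 1); shape (n, order+1)."""
    k = torch.arange(order + 1, dtype=y.dtype)
    binom = torch.tensor([math.comb(order, i) for i in range(order + 1)], dtype=y.dtype)
    y = y.unsqueeze(-1).clamp(1e-6, 1 - 1e-6)
    return binom * y**k * (1 - y)**(order - k)

class BernsteinFlow(nn.Module):
    """Neural network -> increasing Bernstein coefficients -> conditional density
    via change of variables with a standard-normal base distribution."""
    def __init__(self, n_features, order=10, hidden=32):
        super().__init__()
        self.order = order
        self.net = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU(),
                                 nn.Linear(hidden, order + 1))

    def coefficients(self, x):
        raw = self.net(x)
        # First coefficient is free; positive increments make the thetas increasing.
        increments = nn.functional.softplus(raw[:, 1:])
        return torch.cumsum(torch.cat([raw[:, :1], increments], dim=1), dim=1)

    def log_prob(self, y, x):
        """y is assumed to be scaled to (0, 1); x holds the conditioning features."""
        theta = self.coefficients(x)                           # (n, order+1)
        h = (theta * bernstein_basis(y, self.order)).sum(-1)   # h(y|x)
        dtheta = theta[:, 1:] - theta[:, :-1]
        h_prime = self.order * (dtheta * bernstein_basis(y, self.order - 1)).sum(-1)
        base = torch.distributions.Normal(0.0, 1.0)
        return base.log_prob(h) + torch.log(h_prime + 1e-12)

# Training sketch: minimize the negative log-likelihood on stand-in data.
model = BernsteinFlow(n_features=5)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(256, 5)   # synthetic covariates
y = torch.rand(256)       # synthetic load values, already scaled to (0, 1)
for _ in range(200):
    opt.zero_grad()
    loss = -model.log_prob(y, x).mean()
    loss.backward()
    opt.step()
```

    Sampling from the fitted conditional distribution would additionally require inverting the monotone Bernstein transformation, for example by a root search, which is omitted here.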

    Deep transformation models for functional outcome prediction after acute ischemic stroke

    In many medical applications, interpretable models with high prediction performance are sought. Often, those models are required to handle semi-structured data like tabular and image data. We show how to apply deep transformation models (DTMs) for distributional regression which fulfill these requirements. DTMs allow the data analyst to specify (deep) neural networks for different input modalities, making them applicable to various research questions. Like statistical models, DTMs can provide interpretable effect estimates while achieving the state-of-the-art prediction performance of deep neural networks. In addition, the construction of ensembles of DTMs that retain model structure and interpretability allows quantifying epistemic and aleatoric uncertainty. In this study, we compare several DTMs, including baseline-adjusted models, trained on a semi-structured data set of 407 stroke patients with the aim to predict ordinal functional outcome three months after stroke. We follow statistical principles of model-building to achieve an adequate trade-off between interpretability and flexibility while assessing the relative importance of the involved data modalities. We evaluate the models for an ordinal and a dichotomized version of the outcome as used in clinical practice. We show that both tabular clinical data and brain imaging data are useful for functional outcome prediction, while models based on tabular data only outperform those based on imaging data only. There is no substantial evidence for improved prediction when combining both data modalities. Overall, we highlight that DTMs provide a powerful, interpretable approach to analyzing semi-structured data and that they have the potential to support clinical decision making.
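
    The ensemble-based uncertainty quantification mentioned above is commonly implemented, for a discrete outcome such as an ordinal functional score, with the entropy decomposition sketched below: total predictive entropy splits into the expected per-member entropy (aleatoric part) and the mutual information between prediction and ensemble member (epistemic part). The sketch assumes each member already outputs class probabilities and is not specific to the models in this study.

```python
import numpy as np

def ensemble_uncertainty(member_probs):
    """member_probs: array of shape (n_members, n_classes), one predicted class
    distribution per ensemble member for a single patient."""
    p = np.asarray(member_probs, dtype=float)
    eps = 1e-12
    mean_p = p.mean(axis=0)                                    # ensemble prediction
    total = -np.sum(mean_p * np.log(mean_p + eps))             # total predictive entropy
    aleatoric = -np.mean(np.sum(p * np.log(p + eps), axis=1))  # expected member entropy
    epistemic = total - aleatoric                              # mutual information
    return mean_p, total, aleatoric, epistemic

# Example: five hypothetical ensemble members and three outcome classes.
members = np.array([[0.7, 0.2, 0.1],
                    [0.6, 0.3, 0.1],
                    [0.5, 0.3, 0.2],
                    [0.8, 0.1, 0.1],
                    [0.4, 0.4, 0.2]])
mean_p, total, aleatoric, epistemic = ensemble_uncertainty(members)
print(mean_p, total, aleatoric, epistemic)
```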

    Optical Microscopy in the Nano-World

    Scanning near-field optical microscopy (SNOM) is an optical microscopy technique whose resolution is not bound by the diffraction limit. It provides chemical information based on spectral, polarization and/or fluorescence contrast images. Details as small as 20 nm can be recognized. Photophysical and photochemical effects can be studied with SNOM on a similar scale. This article reviews a good deal of the experimental and theoretical work on SNOM in Switzerland.

    Unsupervised assessment of microarray data quality using a Gaussian mixture model

    Background: Quality assessment of microarray data is an important and often challenging aspect of gene expression analysis. This task frequently involves the examination of a variety of summary statistics and diagnostic plots. The interpretation of these diagnostics is often subjective, and generally requires careful expert scrutiny.
    Results: We show how an unsupervised classification technique based on the Expectation-Maximization (EM) algorithm and the naïve Bayes model can be used to automate microarray quality assessment. The method is flexible and can be easily adapted to accommodate alternate quality statistics and platforms. We evaluate our approach using Affymetrix 3' gene expression and exon arrays and compare the performance of this method to a similar supervised approach.
    Conclusion: This research illustrates the efficacy of an unsupervised classification approach for the purpose of automated microarray data quality assessment. Since our approach requires only unannotated training data, it is easy to customize and to keep up-to-date as technology evolves. In contrast to other "black box" classification systems, this method also allows for intuitive explanations.
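
    As a generic, hedged illustration of the unsupervised idea (not the exact EM/naïve Bayes procedure of the paper), the snippet below fits a two-component Gaussian mixture by EM to synthetic per-array quality statistics; the diagonal covariance mirrors the naïve Bayes independence assumption, and the posterior probability of the "good" component can serve as a quality score.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Stand-in quality statistics for 60 arrays (e.g. average background, percent
# present calls, 3'/5' degradation ratio) -- values are synthetic.
good = rng.normal(loc=[50.0, 45.0, 1.2], scale=[5.0, 3.0, 0.1], size=(50, 3))
bad = rng.normal(loc=[90.0, 30.0, 2.5], scale=[10.0, 5.0, 0.4], size=(10, 3))
stats = np.vstack([good, bad])

# Two-component Gaussian mixture fitted by EM, without any labels.
gmm = GaussianMixture(n_components=2, covariance_type="diag", random_state=0)
gmm.fit(stats)

# Interpret the component with the lower average background as the "good"
# cluster and use its posterior probability as a per-array quality score.
good_component = int(np.argmin(gmm.means_[:, 0]))
quality_score = gmm.predict_proba(stats)[:, good_component]
print(np.round(quality_score[:5], 3), np.round(quality_score[-5:], 3))
```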